# Persian Language Understanding

## TookaBERT Base

**License:** Apache-2.0 · **Author:** PartAI · **Downloads:** 127 · **Likes:** 24
**Tags:** Large Language Model, Transformers, Other

TookaBERT is a family of encoder models trained on Persian, available in base and large versions and suitable for a wide range of natural language processing tasks.
## FaBERT

**Author:** sbunlp · **Downloads:** 627 · **Likes:** 15
**Tags:** Large Language Model, Transformers, Other

A BERT model pretrained on Persian blog posts that excels at multiple Persian NLP tasks.
## ALBERT Fa Base V2 NER PEYMA

**License:** Apache-2.0 · **Author:** m3hrdadfi · **Downloads:** 19 · **Likes:** 1
**Tags:** Large Language Model, Transformers, Other

The first ALBERT model specifically for Persian, based on Google's ALBERT Base v2.0 architecture and trained on diverse Persian corpora; this variant is fine-tuned for named-entity recognition on the PEYMA dataset.
## ALBERT Fa Base V2 Sentiment DeepSentiPers Binary

**License:** Apache-2.0 · **Author:** m3hrdadfi · **Downloads:** 25 · **Likes:** 0
**Tags:** Large Language Model, Transformers, Other

A lightweight BERT-style (ALBERT) model for self-supervised language representation learning in Persian; this variant is fine-tuned for binary sentiment classification on the DeepSentiPers dataset.
## ALBERT Fa Base V2 Sentiment Digikala

**License:** Apache-2.0 · **Author:** m3hrdadfi · **Downloads:** 18 · **Likes:** 0
**Tags:** Large Language Model, Transformers, Other

A lightweight BERT-style (ALBERT) model for self-supervised language representation learning in Persian; this variant is fine-tuned for sentiment analysis on Digikala user comments.
## BERT Fa ZWNJ Base

**License:** Apache-2.0 · **Author:** HooshvareLab · **Downloads:** 5,590 · **Likes:** 15
**Tags:** Large Language Model, Other

A Transformer-based Persian language understanding model that correctly handles the zero-width non-joiner (ZWNJ, U+200C) in Persian writing.
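To see why ZWNJ handling matters, here is a small self-contained sketch (plain Python, no model required) of how the zero-width non-joiner changes tokenization of Persian text:

```python
# Illustration of the zero-width non-joiner (ZWNJ, U+200C) in Persian.
# ZWNJ separates morphemes inside a single word without a visible space,
# e.g. the progressive prefix "می" in "می‌روم" ("I am going").
ZWNJ = "\u200c"

with_zwnj = "می" + ZWNJ + "روم"   # one word, correctly joined: "می‌روم"
without_zwnj = "می روم"            # a plain space splits it into two tokens

# Naive whitespace tokenization keeps the ZWNJ form as a single token,
# while the space-separated form becomes two tokens. A model whose
# vocabulary ignores ZWNJ conflates these distinct surface forms.
print(with_zwnj.split())     # → one token
print(without_zwnj.split())  # → two tokens
print(ZWNJ in with_zwnj)     # → True
```

`str.split()` treats U+200C as a regular (non-whitespace) character, which is exactly the ambiguity a ZWNJ-aware tokenizer must resolve.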
## BERT Base ParsBERT Uncased

**Author:** HooshvareLab · **Downloads:** 99.81k · **Likes:** 38
**Tags:** Large Language Model

A Transformer-based Persian language understanding model that outperforms multilingual BERT and other hybrid approaches.
## BERT Fa Base Uncased

**License:** Apache-2.0 · **Author:** HooshvareLab · **Downloads:** 19.57k · **Likes:** 18
**Tags:** Large Language Model, Other

A Transformer-based Persian language understanding model with a rebuilt vocabulary, retrained on new corpora; supports multiple downstream tasks.
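Models like these are published on the Hugging Face Hub and can typically be loaded with the `transformers` library. A minimal fill-mask sketch, assuming the repo ID `HooshvareLab/bert-fa-base-uncased` (inferred from the card name above; verify it on the Hub) and that `transformers` is installed:

```python
# Minimal sketch: load a Persian BERT from the Hugging Face Hub for
# fill-mask. The repo ID below is an assumption inferred from the card
# name above — confirm it on the Hub before relying on it.
MODEL_ID = "HooshvareLab/bert-fa-base-uncased"

def build_fill_mask(model_id: str = MODEL_ID):
    """Construct a fill-mask pipeline (downloads weights on first use)."""
    from transformers import pipeline  # imported lazily; needs network/weights
    return pipeline("fill-mask", model=model_id)

# Usage (downloads model weights on first call):
#   fill = build_fill_mask()
#   fill("او به [MASK] رفت.")   # BERT-style models use the literal [MASK] token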